Complement

As the first exercise, say p is the probability of an event; let's print the probability of the inverse event. Write a function that takes p and returns 1 - p.


In [1]:
#Return the probability of the inverse event of p
def f(p):
    return 1-p

print f(0.3)


0.7

Two Flips

  • Suppose we have a coin whose probability of heads is p. For example, p might be 0.5.
  • You flip the coin twice, and I want to compute the probability that this coin comes up heads in both of these 2 flips--for p = 0.5, that's obviously 0.5 times 0.5.

In [2]:
#Probability that the coin comes up heads on both of 2 flips
def f(p):
    return p*p

print f(0.3)


0.09

Three Flips

Just like before, the coin probability p will be the input to the function f. Now I'm going to flip the coin 3 times, and I want you to calculate the probability that heads comes up exactly once. Three is not a variable, so your code only has to work for 3 flips, not for 2 or 4; the only input variable is the coin probability, for example 0.5.


In [5]:
#Probability of exactly one head in 3 flips of a coin with P(heads)=p
def f(p):
    return 3 * p * (1-p) * (1-p)

print f(0.5)
print f(0.8)


0.375
0.096
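
The factor of 3 counts the three equally weighted orderings with exactly one head (HTT, THT, TTH). As an optional sanity check (my own sketch, not part of the original lesson), we can enumerate all 2³ = 8 flip sequences and sum the probabilities of those with exactly one head; the helper name exactly_one_head is made up for illustration:

from itertools import product

#Brute-force check: enumerate all 8 sequences of 3 flips and
#add up the probability of each sequence with exactly one head
def exactly_one_head(p):
    total = 0.0
    for flips in product('HT', repeat=3):
        if flips.count('H') == 1:
            total += p ** flips.count('H') * (1-p) ** flips.count('T')
    return total

print(exactly_one_head(0.5))   #0.375, matching 3 * p * (1-p) * (1-p)
print(exactly_one_head(0.8))   #0.096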

Flip Two Coins

  • So coin 1 has a probability of heads equal to P₁, and coin 2 has a probability of heads equal to P₂, and these might be different probabilities.
  • In my programming environment, I can account for this by passing 2 arguments separated by a comma, for example 0.5 and 0.8. The function then takes 2 arguments as input, P₁ and P₂, and I can use both of these variables in the return statement.
  • Let's now flip both coins and write the code that computes the probability that coin 1 comes up heads and coin 2 comes up heads. For 0.5 and 0.8, what would this be?

In [7]:
#Probability that both coins come up heads, given P(heads)=p1 for
#coin 1 and P(heads)=p2 for coin 2
def f(p1,p2):
    return p1 * p2

print f(0.5,0.8)


0.4

Flip One Of Two

  • So two coins again, C1 and C2, and let's say each coin has its own probability of coming up heads.
  • For the first coin we call it P1, and for the second, P2. And for reasons that should become clear later, we write these as conditionals.
  • So that means: if the coin you're flipping is C1, then the probability of heads equals P1; if the coin you're flipping is C2, then the probability of heads equals P2.
  • Now, this alludes to the fact that I really want you to pick a coin here. You're going to pick one coin, and the probability of picking coin one, C1, is P0. Logically, it follows that the probability of picking coin two, the other coin, is 1 minus P0.
  • And I'm interested in the probability that heads comes up under the scenario where you first pick a coin at random and then flip it. In this exercise, I give you some very concrete numbers: P0 is 0.3, P1 is 0.5, and P2 is 0.9.

In [9]:
#Probability of heads when coin 1 (P(heads)=p1) is picked with
#probability p0 and coin 2 (P(heads)=p2) otherwise
def f(p0,p1,p2):
    return p0 * p1 + (1-p0) * p2

print f(0.3,0.5,0.9)


0.78

Answer

And the answer is 0.78. The way I got this: you might have picked coin C1, which happens with probability 0.3, and then there's a 0.5 chance of seeing heads. Or you might have picked coin two, with probability 1 minus 0.3 = 0.7, and then the chance of seeing heads is 0.9. Working this all out, 0.3 * 0.5 + 0.7 * 0.9 = 0.78.

Screenshot taken from Udacity
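
This is the law of total probability in action. As an optional cross-check (my own sketch, not part of the original lesson), a quick Monte Carlo simulation of the pick-then-flip process should land near 0.78; the helper name simulate is made up for illustration:

import random

#Simulate the two-stage process: pick coin 1 with probability p0
#(else coin 2), then flip the chosen coin and count the heads
def simulate(p0, p1, p2, n=100000):
    heads = 0
    for _ in range(n):
        p_heads = p1 if random.random() < p0 else p2
        if random.random() < p_heads:
            heads += 1
    return heads / float(n)

print(simulate(0.3, 0.5, 0.9))   #approximately 0.78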

Cancer Example 1

  • Let's go to the cancer example. The prior probability of cancer we will call P₀.
  • P₁ is the probability of a positive test given cancer (the sensitivity). And careful: P₂ is the probability of a negative test result given that you don't have cancer (the specificity).
  • Just to check, suppose the probability of cancer is 0.1, the sensitivity is 0.9, and the specificity is 0.8.
  • Give the probability that the test will come out positive. It's not Bayes' rule yet; it's a simpler calculation, and you should know exactly how to do this.
- P(C) = p0 = 0.1
- P(Pos|C) = p1 = 0.9
- P(Neg|not C) = p2 = 0.8

- P(C, Pos) = P(C) x P(Pos|C) = 0.1 * 0.9 = 0.09
- P(not C, Pos) = P(not C) x P(Pos|not C) = (1 - 0.1) * (1 - 0.8) = 0.9 * 0.2 = 0.18
- P(Pos) = P(C, Pos) + P(not C, Pos) = 0.09 + 0.18 = 0.27

Screenshot taken from Udacity

Calculate Total

So now I want you to write the computer code that accepts arbitrary P₀, P₁, P₂ and calculates the resulting probability of a positive test result.


In [13]:
#Calculate the probability of a positive result given that
#p0=P(C)
#p1=P(Positive|C)
#p2=P(Negative|Not C)

def f(p0,p1,p2):
    return p0 * p1 + (1-p0) * (1-p2)

print f(0.1, 0.9, 0.8)


0.27
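
To tie this back to the manual computation above, here's an optional breakdown (my own sketch, not part of the original lesson) printing the two terms of the sum separately:

p0, p1, p2 = 0.1, 0.9, 0.8
print(p0 * p1)                      #joint P(C, Pos), about 0.09
print((1-p0) * (1-p2))              #joint P(not C, Pos), about 0.18
print(p0 * p1 + (1-p0) * (1-p2))    #P(Pos), about 0.27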

Cancer Example 2

Let's look at the posterior probability of cancer given that we received the positive test result, and let's first do this manually for the example given up here.

- P(C, Pos) = P(C) x P(Pos|C) = 0.1 * 0.9 = 0.09
- P(not C, Pos) = P(not C) x P(Pos|not C) = 0.9 * 0.2 = 0.18
- P(Pos) = P(C, Pos) + P(not C, Pos) = 0.27
- P(C|Pos) = P(C, Pos)/P(Pos) = 0.09/0.27 = 1/3 ≈ 0.333

Answer

  • And the answer is 0.333, or 1/3, and now we're going to apply the entire arsenal of inference we just learned about.
  • The joint probability of cancer and a positive test is 0.1 * 0.9. That's the joint, which is not yet normalized.
  • So let's normalize it, and we normalize by the sum of the joint for cancer and the joint for non-cancer. The joint for cancer we just computed, but the joint for non-cancer assumes the opposite prior, 1 - 0.1, and applies the probability of a positive result in the non-cancer case.
  • Now, because the specificity refers to a negative result, we have to do the same trick as before and multiply by 1 - 0.8. When you work this out, you find the answer to be 0.09 divided by 0.09 + 0.9 * 0.2, where 0.9 * 0.2 = 0.18.
  • So if you put all of this together, you get exactly a third.

Screenshot taken from Udacity

Program Bayes Rule

  • So I want you to program this in the IDE, where there are three input parameters P₀, P₁ and P₂.
  • For the values above (prior 0.1, sensitivity 0.9, specificity 0.8), you should get 1/3, and for 0.01 as the prior, 0.7 as the sensitivity, and 0.9 as the specificity, you'll get approximately 0.066. So write this code and check whether these examples work for you.

Screenshot taken from Udacity


In [16]:
#Return the probability of A conditioned on B given that 
#P(A)=p0, P(B|A)=p1, and P(Not B|Not A)=p2 

def f(p0,p1,p2):
    return p0 * p1 / (p0 * p1 + (1-p0) * (1-p2))

print f(0.1, 0.9, 0.8)
print f(0.01, 0.7, 0.9)


0.333333333333
0.0660377358491
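
As an aside (my own sketch, not part of the original lesson), this posterior is just the joint p0 * p1 divided by the total probability of a positive test from the Calculate Total section; the helper names total_pos and posterior_pos are made up for illustration:

#Total probability of a positive test, as in Calculate Total
def total_pos(p0, p1, p2):
    return p0 * p1 + (1-p0) * (1-p2)

#Posterior P(C|Pos) = joint / normalizer
def posterior_pos(p0, p1, p2):
    return p0 * p1 / total_pos(p0, p1, p2)

print(posterior_pos(0.1, 0.9, 0.8))   #0.333..., matching f above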

Program Bayes Rule 2

  • Now, let's do one last modification and write this procedure assuming you observed a negative test result.
  • This means the posterior probability of having cancer given a negative result is 0.0137 for the first set of numbers (0.1, 0.9, 0.8) and about 0.00336 for the second set (0.01, 0.7, 0.9).
  • In both cases, the posterior is significantly smaller than the prior because we received a negative test result.

Screenshot taken from Udacity

Answer

  • And here's my implementation for the cancer case.
  • We now have to plug in the probability of seeing a negative test result given cancer, which is one minus the sensitivity. In the normalizer, we copy the first term over, and in the second term, for the non-cancer hypothesis, we just put in the specificity. When you put this all together and run the procedure, we indeed get 0.013698 and so on.

In [19]:
#Return the probability of A conditioned on Not B given that 
#P(A)=p0, P(B|A)=p1, and P(Not B|Not A)=p2 

def f(p0,p1,p2):
    return p0 * (1-p1) / (p0 * (1-p1) + (1-p0) * p2)

print f(0.1, 0.9, 0.8)
print f(0.01, 0.7, 0.9)


0.013698630137
0.00335570469799
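
As a final sanity check (my own addition, not from the lesson), the two posteriors should recombine via the law of total probability to recover the prior: P(C) = P(C|Pos) x P(Pos) + P(C|Neg) x P(Neg). A minimal sketch, with made-up helper names:

#Posterior given a positive test, P(C|Pos)
def post_pos(p0, p1, p2):
    return p0 * p1 / (p0 * p1 + (1-p0) * (1-p2))

#Posterior given a negative test, P(C|Neg)
def post_neg(p0, p1, p2):
    return p0 * (1-p1) / (p0 * (1-p1) + (1-p0) * p2)

#Total probability of a positive test, P(Pos)
def p_pos(p0, p1, p2):
    return p0 * p1 + (1-p0) * (1-p2)

p0, p1, p2 = 0.1, 0.9, 0.8
pos = p_pos(p0, p1, p2)
print(post_pos(p0, p1, p2) * pos + post_neg(p0, p1, p2) * (1-pos))   #approximately 0.1, the prior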